On the Entropy Region of Gaussian Random Variables
Abstract
Given n (discrete or continuous) random variables X_i, the (2^n − 1)-dimensional vector obtained by evaluating the joint entropy of every non-empty subset of {X_1, ..., X_n} is called an entropic vector. Determining the region of entropic vectors is an important open problem with many applications in information theory. Recently, it has been shown that the entropy regions for discrete and continuous random variables, though different, can be determined from one another. An important class of continuous random variables is the class of vector-valued, jointly Gaussian random variables. It is known that Gaussian random variables violate the Ingleton bound, which many random variables, such as those obtained from linear codes over finite fields, do satisfy, and that they also achieve certain non-Shannon-type inequalities. In this paper we give a full characterization of the convex cone of the entropy region of three jointly Gaussian vector-valued random variables and prove that it coincides with the convex cone for three scalar-valued Gaussian random variables, and further that it yields the entire entropy region of three arbitrary random variables. We also determine, up to a conjecture, the actual entropy region of three vector-valued jointly Gaussian random variables. For n ≥ 4 random variables, we point out a set of 2^n − 1 − n(n+1)/2 minimal necessary and sufficient conditions that 2^n − 1 numbers must satisfy in order to correspond to the entropy vector of n scalar jointly Gaussian random variables. This improves on a result of Holtz and Sturmfels, which gave a non-minimal set of conditions. These constraints are related to Cayley's hyperdeterminant, and hence, with an eye towards characterizing the entropy region of jointly Gaussian random variables, we also present some new results in this area.
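As an illustrative sketch (not part of the paper), the entropic vector of jointly Gaussian scalars can be computed directly from a covariance matrix, since the differential entropy of a Gaussian subvector X_S is h(X_S) = ½ log((2πe)^|S| det Σ_S). The helper name and the example covariance below are our own choices:

```python
import numpy as np
from itertools import combinations

def gaussian_entropy_vector(Sigma):
    """Entropy vector of jointly Gaussian scalars with covariance Sigma.

    For each non-empty subset S of indices, the differential entropy is
    h(X_S) = 0.5 * log((2*pi*e)^|S| * det(Sigma_S)), where Sigma_S is the
    corresponding principal submatrix. Returns a dict mapping index tuples
    to entropies: 2^n - 1 entries in total.
    """
    n = Sigma.shape[0]
    h = {}
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            sub = Sigma[np.ix_(S, S)]  # principal submatrix for subset S
            h[S] = 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(sub))
    return h

# Hypothetical example: three independent unit-variance Gaussians
vec = gaussian_entropy_vector(np.eye(3))
```

For independent variables the joint entropy is additive, so the full-set entry equals the sum of the three singleton entries.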
We obtain a new (determinant) formula for the 2 × 2 × 2 hyperdeterminant, and we also give a new (transparent) proof of the fact that the principal minors of an n × n symmetric matrix satisfy the 2 × 2 × ... × 2 (n times) hyperdeterminant relations.
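The second statement can be checked numerically for n = 3 (a minimal sketch, assuming the standard polynomial expansion of Cayley's 2 × 2 × 2 hyperdeterminant; the function names are ours): index the principal minors of a symmetric 3 × 3 matrix by subsets of {1, 2, 3}, arrange them as a 2 × 2 × 2 array, and the hyperdeterminant vanishes.

```python
import numpy as np

def hyperdet_222(a):
    """Cayley's hyperdeterminant of a 2x2x2 array, via its standard expansion."""
    s = (a[0,0,0]**2 * a[1,1,1]**2 + a[0,0,1]**2 * a[1,1,0]**2
         + a[0,1,0]**2 * a[1,0,1]**2 + a[0,1,1]**2 * a[1,0,0]**2)
    p = (a[0,0,0]*a[0,0,1]*a[1,1,0]*a[1,1,1] + a[0,0,0]*a[0,1,0]*a[1,0,1]*a[1,1,1]
         + a[0,0,0]*a[0,1,1]*a[1,0,0]*a[1,1,1] + a[0,0,1]*a[0,1,0]*a[1,0,1]*a[1,1,0]
         + a[0,0,1]*a[0,1,1]*a[1,1,0]*a[1,0,0] + a[0,1,0]*a[0,1,1]*a[1,0,1]*a[1,0,0])
    q = (a[0,0,0]*a[0,1,1]*a[1,0,1]*a[1,1,0]
         + a[0,0,1]*a[0,1,0]*a[1,0,0]*a[1,1,1])
    return s - 2 * p + 4 * q

def principal_minor_tensor(A):
    """2x2x2 array t[i,j,k] = principal minor of A on the rows/columns
    selected by the bits (i, j, k); the empty minor is 1."""
    t = np.empty((2, 2, 2))
    for i in (0, 1):
        for j in (0, 1):
            for k in (0, 1):
                idx = [m for m, bit in enumerate((i, j, k)) if bit]
                t[i, j, k] = np.linalg.det(A[np.ix_(idx, idx)]) if idx else 1.0
    return t

# Hypothetical example: a random symmetric 3x3 matrix
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2
val = hyperdet_222(principal_minor_tensor(A))
```

Up to floating-point error, `val` is zero for any symmetric A, which is exactly the n = 3 case of the hyperdeterminantal relations among principal minors.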
Similar Articles
ADK Entropy and ADK Entropy Rate in Irreducible- Aperiodic Markov Chain and Gaussian Processes
In this paper, the two-parameter ADK entropy, a generalization of Rényi entropy, is considered and some of its properties are investigated. We show that the ADK entropy for continuous random variables is invariant under a location transformation but not under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this ent...
Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions
Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under some suitable conditions on the coefficients.
The Rate of Entropy for Gaussian Processes
In this paper, we show that the Tsallis entropy rate for stochastic processes can be obtained as the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon, and Tsallis entropy rates for stationary Gaussian proc...
Strong Convergence of Weighted Sums for Negatively Orthant Dependent Random Variables
We discuss in this paper the strong convergence of weighted sums of negatively orthant dependent (NOD) random variables by generalized Gaussian techniques. As a corollary, a Cesàro law of large numbers for i.i.d. random variables is extended to the NOD setting by generalized Gaussian techniques.
The rates of convergence for generalized entropy of the normalized sums of IID random variables
We consider the generalized differential entropy of normalized sums of independent and identically distributed (IID) continuous random variables. We prove that the Rényi entropy and Tsallis entropy of order α (α > 0) of the normalized sum of IID continuous random variables with bounded moments converge to the corresponding Rényi entropy and Tsallis entropy of the Gaussian limit, and obtai...
Journal: CoRR
Volume: abs/1112.0061
Published: 2011